An Inequality with Applications to Structured Sparsity and Multitask Dictionary Learning

Authors

  • Andreas Maurer
  • Massimiliano Pontil
  • Bernardino Romera-Paredes
Abstract

From concentration inequalities for the suprema of Gaussian or Rademacher processes an inequality is derived. It is applied to sharpen existing and to derive novel bounds on the empirical Rademacher complexities of unit balls in various norms appearing in the context of structured sparsity and multitask dictionary learning or matrix factorization. A key role is played by the largest eigenvalue of the data covariance matrix.
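As background (not part of the paper's abstract), the quantity the abstract refers to is the empirical Rademacher complexity; the notation below, including $\hat{\mathcal R}_n$, $\sigma_i$ and $\hat\Sigma$, is ours and only illustrates the standard definitions.

$$
\hat{\mathcal R}_n(\mathcal F) \;=\; \mathbb E_{\sigma}\!\left[\,\sup_{f\in\mathcal F}\frac{1}{n}\sum_{i=1}^{n}\sigma_i\, f(x_i)\right],
\qquad \sigma_1,\dots,\sigma_n \ \text{i.i.d. uniform on } \{-1,+1\},
$$

where $x_1,\dots,x_n$ is the observed sample and $\mathcal F$ is, in this setting, the unit ball of one of the structured-sparsity or dictionary-learning norms. The data covariance matrix mentioned in the abstract is the empirical second-moment matrix

$$
\hat\Sigma \;=\; \frac{1}{n}\sum_{i=1}^{n} x_i x_i^{\top},
$$

and the bounds discussed in the paper involve its largest eigenvalue $\lambda_{\max}(\hat\Sigma)$.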

Similar articles

Speech Enhancement using Adaptive Data-Based Dictionary Learning

In this paper, a speech enhancement method based on sparse representation of data frames is presented. Speech enhancement is one of the most widely applied areas of signal processing. The objective of a speech enhancement system is to improve either the intelligibility or the quality of speech signals. This process is carried out using speech signal processing techniques ...

Online Dictionary Learning with Group Structure Inducing Norms

• Sparse coding.
• Structured sparsity (e.g., disjunct groups, trees): increased performance in several applications.
• Our goal: develop a dictionary learning method which
  – enables general overlapping group structures,
  – is online: fast, memory efficient, adaptive,
  – applies non-convex sparsity-inducing regularization:
    ∗ fewer measurements,
    ∗ weaker conditions on the dictionary,
    ∗ robust (w....
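For orientation (not taken from the snippet above), one common family of group structure inducing norms is the group lasso penalty; the symbols $\Omega$, $\mathcal G$ and $\alpha_G$ below are our own notation.

$$
\Omega(\alpha) \;=\; \sum_{G\in\mathcal G} w_G\,\|\alpha_G\|_2 ,
$$

where $\mathcal G$ is a collection of (possibly overlapping) index groups, $\alpha_G$ is the restriction of the coefficient vector $\alpha$ to the indices in $G$, and $w_G>0$ are group weights. Used as a regularizer, $\Omega$ drives whole groups of coefficients to zero at once, which is what induces structured (rather than merely cardinality-based) sparsity.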

Structured Sparse Principal Component Analysis

We present an extension of sparse PCA, or sparse dictionary learning, where the sparsity patterns of all dictionary elements are structured and constrained to belong to a prespecified set of shapes. This structured sparse PCA is based on a structured regularization recently introduced by [1]. While classical sparse priors only deal with cardinality, the regularization we use encodes higher-orde...

Task-Driven Dictionary Learning for Hyperspectral Image Classification with Structured Sparsity Constraints

Sparse representation models a signal as a linear combination of a small number of dictionary atoms. As a generative model, it requires the dictionary to be highly redundant in order to ensure both a stable high sparsity level and a low reconstruction error for the signal. Howe...
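To make the opening sentence of the snippet above concrete (our notation, not the authors'), the sparse representation model can be written as

$$
x \;\approx\; D\alpha \;=\; \sum_{j=1}^{k} \alpha_j\, d_j ,
\qquad \|\alpha\|_0 \le s \ll k ,
$$

where $x\in\mathbb R^d$ is the signal, $D=[d_1,\dots,d_k]\in\mathbb R^{d\times k}$ is the (typically overcomplete, $k>d$) dictionary whose columns are the atoms, and $\alpha$ is a coefficient vector with at most $s$ nonzero entries. In practice the $\ell_0$ constraint is often relaxed to an $\ell_1$ penalty, e.g. $\min_\alpha \tfrac12\|x-D\alpha\|_2^2+\lambda\|\alpha\|_1$.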

Structured Dictionary Learning for Classification

Sparsity-driven signal processing has gained tremendous popularity in the last decade. At its core, the assumption is that the signal of interest is sparse with respect to either a fixed transformation or a signal-dependent dictionary. To better capture the data characteristics, various dictionary learning methods have been proposed for both reconstruction and classification tasks. For classifi...

Journal:

Volume   Issue

Pages  -

Publication year: 2014